
Reviews: Emergence of Object Segmentation in Perturbed Generative Models

Neural Information Processing Systems

This paper presents a generative model of a layered object representation in which a generator jointly synthesizes a foreground object, a foreground mask, and a background to compose an image, while a discriminator judges the realism of the composite. To prevent the generator from cheating, i.e., generating the same foreground and background with an arbitrary mask, the paper proposes adding a random shift to the generated foreground object and its mask. The rationale is that the generated foreground and its mask must be valid for such random shifts to preserve the realism of the scene; in other words, in a valid layered scene the foreground can move independently of the background. As a result, a high-quality foreground mask emerges from this layered generative model, and an encoder can then be trained to predict the mask from an image for object segmentation.
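The random-shift perturbation described above can be sketched in a few lines (a minimal NumPy illustration, not the authors' implementation; the function names, array shapes, and shift range are assumptions):

```python
import numpy as np

def shift_layer(layer, dy, dx):
    """Translate an H x W x C array by (dy, dx) pixels, zero-padding
    the region vacated by the shift (illustrative helper)."""
    out = np.zeros_like(layer)
    h, w = layer.shape[:2]
    out[max(dy, 0):h + min(dy, 0), max(dx, 0):w + min(dx, 0)] = \
        layer[max(-dy, 0):h + min(-dy, 0), max(-dx, 0):w + min(-dx, 0)]
    return out

def perturbed_composite(background, foreground, mask, max_shift=4, rng=None):
    """Apply the SAME random shift to the foreground and its mask,
    then alpha-composite the shifted layer onto the background."""
    rng = rng if rng is not None else np.random.default_rng()
    dy, dx = rng.integers(-max_shift, max_shift + 1, size=2)
    fg = shift_layer(foreground, dy, dx)
    mk = shift_layer(mask, dy, dx)
    return mk * fg + (1.0 - mk) * background
```

The key point the sketch illustrates: because the foreground and mask are shifted together, a degenerate mask (e.g. one that covers the whole image) yields an unrealistic composite after the shift, which the discriminator can penalize.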


Reviews: Emergence of Object Segmentation in Perturbed Generative Models

Neural Information Processing Systems

The paper presents a scheme for learning object segmentation from a set of images without annotation. The main assumption on which the approach is built is that the location of object segments can be perturbed relative to the background without affecting the realism of the scene. The method is shown empirically to improve segmentation performance on real images of several object categories. All reviewers found the contributions of this work significant and interesting, both in terms of methodology (simple but original) and empirical results. Please incorporate the improvements suggested by the reviewers in the final version.


Emergence of Object Segmentation in Perturbed Generative Models

Bielski, Adam, Favaro, Paolo

Neural Information Processing Systems

We introduce a novel framework for learning to segment objects from a collection of images without any human annotation. Our method builds on the observation that the location of object segments can be perturbed locally relative to a given background without affecting the realism of the scene. Our approach is to first train a generative model of a layered scene. The layered representation consists of a background image, a foreground image, and the mask of the foreground. A composite image is then obtained by overlaying the masked foreground image onto the background. The generative model is trained in an adversarial fashion against a discriminator, which forces it to produce realistic composite images.
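The layered compositing step described in the abstract amounts to a one-line alpha blend (a schematic NumPy sketch; the shapes and the `composite` name are assumptions, not the authors' code):

```python
import numpy as np

def composite(background, foreground, mask):
    """Overlay the masked foreground onto the background.

    background, foreground: H x W x C arrays with values in [0, 1];
    mask: H x W x 1 alpha map in [0, 1], broadcast over channels.
    """
    return mask * foreground + (1.0 - mask) * background
```

In the full model these three arrays would be produced by the generator, and the output of a call like `composite(bg, fg, mask)` is what the discriminator scores for realism.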